Can deterministic penalty terms model the effects of synaptic weight noise on network fault-tolerance?
Abstract
This paper investigates fault tolerance in feedforward neural networks under a realistic fault model based on analog hardware. In our previous work with synaptic weight noise, we showed a significant fault-tolerance enhancement over standard training algorithms. We proposed that, when introduced into training, weight noise distributes the network computation more evenly across the weights and thus enhances fault tolerance. Here we compare those results with an approximation to the mechanisms induced by stochastic weight noise, incorporated into training deterministically via penalty terms. Since the penalty terms are an approximation to weight saliency, we also assess a number of other weight-saliency measures and perform comparison experiments. The results show that the first-term approximation is an incomplete model of weight noise in terms of fault tolerance, and that the error Hessian is the most accurate measure of weight saliency.
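To make the comparison concrete, here is a minimal sketch of the two training schemes in PyTorch. The multiplicative noise model w -> w(1 + delta), the noise level sigma, and the penalty form sigma^2 * sum_i w_i^2 (dE/dw_i)^2 are illustrative assumptions; the paper's exact first-order term is not reproduced here.

    import torch

    def noisy_step(model, loss_fn, x, t, sigma=0.05):
        # Stochastic scheme (assumed multiplicative noise): perturb each
        # weight w -> w(1 + delta), delta ~ N(0, sigma^2), per presentation.
        saved = [p.detach().clone() for p in model.parameters()]
        with torch.no_grad():
            for p in model.parameters():
                p.mul_(1 + sigma * torch.randn_like(p))
        loss = loss_fn(model(x), t)
        loss.backward()                    # gradients w.r.t. the noisy weights
        with torch.no_grad():              # restore the clean weights;
            for p, w in zip(model.parameters(), saved):
                p.copy_(w)                 # .grad is left intact for the update
        return loss.item()

    def penalty_step(model, loss_fn, x, t, sigma=0.05):
        # Deterministic scheme (assumed penalty form): add
        # sigma^2 * sum_i w_i^2 * (dE/dw_i)^2 to the cost.
        params = list(model.parameters())
        loss = loss_fn(model(x), t)
        grads = torch.autograd.grad(loss, params, create_graph=True)
        penalty = sum(((p * g) ** 2).sum() for p, g in zip(params, grads))
        total = loss + sigma ** 2 * penalty
        total.backward()
        return total.item()

Either step is followed by an ordinary optimiser update (the caller zeroes gradients first); the question the paper asks is whether the deterministic penalty reproduces the fault-tolerance benefit of the stochastic scheme.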
Similar resources
Note on Weight Noise Injection During Training a MLP
Although many analytical studies have investigated how the prediction error of a trained NN changes when its weights are injected with noise, few have examined the dynamical properties (such as objective functions and convergence behaviour) of injecting weight noise during training. In this paper, four different online weight noise injection training algorithms for mul...
Penalty Terms for Fault Tolerance
This paper presents penalty terms for fault tolerance enhancement. We argue that the use of conventional learning algorithms leads to networks whose solutions are non-optimally distributed and hence susceptible to faults. In addition, we assert that fault tolerance will become an increasingly important factor in practical applications of neural networks. To address these issues we prese...
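The susceptibility in question can be quantified by exhaustive fault injection. A minimal sketch, assuming single-weight stuck-at-zero faults (an analog fault model, as in the main paper, might instead perturb weights or clamp them to hardware rail values):

    import torch

    def mean_fault_degradation(model, loss_fn, x, t):
        # Zero each weight in turn, record the test loss, and
        # restore the weight afterwards.
        with torch.no_grad():
            baseline = loss_fn(model(x), t).item()
            faulted = []
            for p in model.parameters():
                flat = p.view(-1)
                for i in range(flat.numel()):
                    w = flat[i].item()
                    flat[i] = 0.0                     # inject the fault
                    faulted.append(loss_fn(model(x), t).item())
                    flat[i] = w                       # repair it
        return sum(faulted) / len(faulted) - baseline

A network whose computation is evenly distributed shows a small mean degradation; one that relies on a few critical weights shows a large one.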
Synaptic Weight Noise During MLP Learning Enhances Fault-Tolerance, Generalization and Learning Trajectory
We analyse the effects of analog noise on the synaptic arithmetic during MultiLayer Perceptron training by expanding the cost function to include noise-mediated penalty terms. Predictions are made in the light of these calculations, which suggest that fault tolerance, generalisation ability and learning trajectory should be improved by such noise injection. Extensive simulation experiments on t...
Enhanced MLP performance and fault tolerance resulting from synaptic weight noise during training
We analyze the effects of analog noise on the synaptic arithmetic during multilayer perceptron training by expanding the cost function to include noise-mediated terms. Predictions are made in the light of these calculations, which suggest that fault tolerance, training quality and training trajectory should be improved by such noise injection. Extensive simulation experiments on two distinct cla...
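The abstract above reports the error Hessian as the most accurate weight-saliency measure. For reference, here is a sketch of three commonly used candidates; the definitions (weight magnitude, gradient product, and the Optimal-Brain-Damage-style term 0.5 * H_ii * w_i^2) are standard choices assumed for illustration, not necessarily those assessed in the paper:

    import torch

    def saliency_measures(model, loss_fn, x, t):
        params = list(model.parameters())
        loss = loss_fn(model(x), t)
        grads = torch.autograd.grad(loss, params, create_graph=True)
        g = torch.cat([gr.reshape(-1) for gr in grads])
        w = torch.cat([p.reshape(-1) for p in params])
        # Exact Hessian diagonal, one double-backward per weight
        # (fine for a small MLP, far too slow for large networks).
        h = torch.zeros_like(w)
        for i in range(w.numel()):
            row = torch.autograd.grad(g[i], params, retain_graph=True)
            h[i] = torch.cat([r.reshape(-1) for r in row])[i].detach()
        return {
            "magnitude": w.abs().detach(),            # |w_i|
            "gradient": (w * g).abs().detach(),       # |w_i * dE/dw_i|
            "hessian": (0.5 * h * w ** 2).detach(),   # 0.5 * H_ii * w_i^2
        }

Ranking weights by each measure and zeroing the least salient first gives a direct way to compare how well each measure predicts the fault sensitivity observed under injection.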
Journal: International Journal of Neural Systems
Volume 6, Issue 4
Pages: -
Published: 1995